Preprocessing for Determining the Difficulty of and Selecting a Solution Strategy for Nonconvex Optimization Problems
Authors
Abstract
Based originally on the work of McCormick, a number of recent global optimization algorithms have relied on replacing an original nonconvex nonlinear program by convex or linear relaxations. Such linear relaxations can be generated automatically through an automatic differentiation process. This process decomposes the objective and constraints (if any) into convex and nonconvex unary and binary operations. The convex operations can be approximated arbitrarily well by appending additional constraints, while the domain must somehow be subdivided (in an overall branch and bound process or in some other local process) to handle nonconvex constraints. In general, a problem can be hard if even a single nonconvex term appears. However, certain nonconvex terms lead to easier-to-solve problems than others. In this paper, we present a symbolic preprocessing step that provides a measure of the intrinsic difficulty of a problem. Based on this step, one of two methods can be chosen to relax nonconvex ...
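To make the decomposition idea concrete, the following is a minimal sketch, not the paper's actual preprocessing step: the `Node` type, the operation classification, and `count_nonconvex_terms` are hypothetical simplifications used only to illustrate walking an expression tree of unary and binary operations and counting the nonconvex ones as a crude difficulty indicator.

```python
# Hypothetical illustration only: decompose an expression into unary/binary
# operations and count the nonconvex ones as a rough difficulty measure.
# A real system would also track variable bounds and signs before classifying.
from dataclasses import dataclass

@dataclass
class Node:
    op: str                 # e.g. "+", "*", "exp", "var"
    children: tuple = ()

# Operations treated as nonconvex in this toy classification (e.g. bilinear
# and trigonometric terms); everything else is taken as convexity-preserving.
NONCONVEX_OPS = {"*", "sin", "cos"}

def count_nonconvex_terms(node: Node) -> int:
    """Count nonconvex unary/binary operations in the expression tree."""
    own = 1 if node.op in NONCONVEX_OPS else 0
    return own + sum(count_nonconvex_terms(c) for c in node.children)

# f(x, y) = exp(x) + x*y contains exactly one nonconvex (bilinear) term
x, y = Node("var"), Node("var")
f = Node("+", (Node("exp", (x,)), Node("*", (x, y))))
print(count_nonconvex_terms(f))   # 1
```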
Similar resources
An Efficient Neurodynamic Scheme for Solving a Class of Nonconvex Nonlinear Optimization Problems
By a p-power (or partial p-power) transformation, the Lagrangian function of a nonconvex optimization problem becomes locally convex. In this paper, we present a neural network based on an NCP function for solving the nonconvex optimization problem. An important feature of this neural network is the one-to-one correspondence between its equilibria and the KKT points of the nonconvex optimizatio...
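The snippet does not say which NCP function the authors use; a standard choice is the Fischer-Burmeister function, sketched below only to show how the complementarity part of the KKT conditions can be rewritten as an equation.

```python
# Fischer-Burmeister NCP function: phi(a, b) = sqrt(a^2 + b^2) - a - b,
# which is zero exactly when a >= 0, b >= 0, and a*b = 0 (componentwise).
# This is a standard example, not necessarily the NCP function of the paper.
import numpy as np

def fischer_burmeister(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """phi(a, b) = 0  <=>  a >= 0, b >= 0, a*b = 0 (componentwise)."""
    return np.sqrt(a**2 + b**2) - a - b

# KKT complementarity lambda >= 0, g(x) <= 0, lambda*g(x) = 0
# becomes the equation phi(lambda, -g(x)) = 0:
lam = np.array([0.0, 2.0])
g = np.array([-1.5, 0.0])
print(fischer_burmeister(lam, -g))   # approximately [0, 0]
```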
An efficient improvement of the Newton method for solving nonconvex optimization problems
The Newton method is one of the most famous numerical line search methods for minimizing functions. It is well known that the search direction and step length play important roles in this class of methods for solving optimization problems. In this investigation, a new modification of the Newton method for solving unconstrained optimization problems is presented. The significant ...
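For reference, here is a minimal sketch of the classical damped Newton iteration with a backtracking (Armijo) step length; this is the baseline method the abstract starts from, not the authors' modification, which the snippet does not describe.

```python
# Damped Newton iteration with backtracking line search for unconstrained
# minimization; a textbook sketch, assuming grad and hess are supplied.
import numpy as np

def damped_newton(f, grad, hess, x0, tol=1e-8, max_iter=100):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess(x), -g)   # search direction: solve H d = -g
        t = 1.0                            # backtracking on the step length
        for _ in range(30):
            if f(x + t * d) <= f(x) + 1e-4 * t * g.dot(d):
                break
            t *= 0.5
        x = x + t * d
    return x

# Example: minimize the Rosenbrock function from the standard starting point
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
hess = lambda x: np.array([[2 - 400*(x[1] - 3*x[0]**2), -400*x[0]],
                           [-400*x[0], 200.0]])
print(damped_newton(f, grad, hess, [-1.2, 1.0]))   # near [1, 1]
```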
Validated Linear Relaxations and Preprocessing: Some Experiments
Based on work originating in the early 1970s, a number of recent global optimization algorithms have relied on replacing an original nonconvex nonlinear program by convex or linear relaxations. Such linear relaxations can be generated automatically through an automatic differentiation process. This process decomposes the objective and constraints (if any) into convex and nonconvex unary and bin...
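As an illustration of how a single nonconvex binary operation can be replaced by linear constraints, below is the McCormick envelope of a bilinear term w = x*y over a box. This is a plain floating-point sketch; the validated relaxations studied in this article require additional care (e.g. with rounding) that is not shown here, and the helper name is hypothetical.

```python
# McCormick's linear under- and overestimators for w = x*y on [xL, xU] x [yL, yU]:
# the four cuts below bound the bilinear term by linear expressions in x and y.
def mccormick_bounds(x, y, xL, xU, yL, yU):
    """Return (lower, upper) bounds on x*y implied by the four McCormick cuts."""
    lower = max(xL * y + x * yL - xL * yL,
                xU * y + x * yU - xU * yU)
    upper = min(xU * y + x * yL - xU * yL,
                xL * y + x * yU - xL * yU)
    return lower, upper

# At an interior point the envelope strictly relaxes x*y:
lo, up = mccormick_bounds(0.5, 0.5, 0.0, 1.0, 0.0, 1.0)
print(lo, 0.5 * 0.5, up)   # 0.0 <= 0.25 <= 0.5
```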
Optimality conditions for approximate solutions of vector optimization problems with variable ordering structures
We consider nonconvex vector optimization problems with variable ordering structures in Banach spaces. Under certain boundedness and continuity properties, we present necessary conditions for approximate solutions of these problems. Using a generic approach to subdifferentials, we derive necessary conditions for approximate minimizers and approximately minimal solutions of vector optimizatio...
Selecting Efficient Service-providers in Electric Power Distribution Industry Using Combinatorial Reverse Auction
In this paper, a combinatorial reverse auction mechanism is proposed for selecting the most efficient service-providers for resolving sustained power interruptions in multiple regions of an electric power distribution company’s responsibility area. Through this mechanism, supplying the required service in each region is assigned to only one potential service-provider considering two criteria in...
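As a toy illustration of the winner-determination step in such an auction, the brute-force sketch below assigns every region to exactly one provider at minimum total asking price. The data and the single cost criterion are hypothetical; the paper's mechanism uses two criteria and is not reproduced here.

```python
# Toy winner determination for a combinatorial reverse auction: each bid
# offers to serve a set of regions for a price, every region must be served
# by exactly one provider, and total cost is minimized by brute force.
from itertools import combinations

regions = {"R1", "R2", "R3"}
bids = [                       # (provider, regions covered, asking price)
    ("P1", {"R1", "R2"}, 8.0),
    ("P2", {"R3"}, 3.0),
    ("P1", {"R3"}, 4.0),
    ("P3", {"R1"}, 5.0),
    ("P3", {"R2", "R3"}, 6.0),
]

best_cost, best_choice = float("inf"), None
for k in range(1, len(bids) + 1):
    for combo in combinations(bids, k):
        covered = [r for _, regs, _ in combo for r in regs]
        if sorted(covered) == sorted(regions):   # exact partition of the regions
            cost = sum(price for _, _, price in combo)
            if cost < best_cost:
                best_cost, best_choice = cost, combo

print(best_cost, [(p, sorted(r)) for p, r, _ in best_choice])
```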